Sparsistency of ℓ1-Regularized M-Estimators: Supplementary Material

1 Auxiliary Result for the Non-structured Case

Authors

  • Yen-Huan Li
  • Jonathan Scarlett
  • Pradeep Ravikumar
  • Volkan Cevher
Abstract

The proof is based on the optimality conditions on β̂n for the original problem and those on β̌n for the restricted problem. We first observe that β̌n exists, since the function x ↦ ‖x‖1 is coercive. Recall that β̌n is assumed to be uniquely defined. To achieve sparsistency, it suffices that β̂n = β̌n and supp β̌n = supp β∗. We derive a sufficient condition for β̂n = β̌n in Lemma 2.1, and make this condition explicitly dependent on the problem parameters in Lemma 2.2. Lemma 2.2 will require that ‖β̌n − β∗‖2 ≤ Rn for some Rn > 0. We derive an estimation error bound of the form ‖β̌n − β∗‖2 ≤ rn in Lemma 2.4. We then conclude that β̂n = β̌n whenever rn ≤ Rn and the assumptions of Lemma 2.2 are satisfied, from which it follows that sign β̌n = sign β∗ provided that βmin ≥ rn.
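To make the chain of implications explicit, the following is a minimal LaTeX sketch of the argument just described. It introduces nothing beyond the quantities defined above, except the elementary inequality ‖x‖∞ ≤ ‖x‖2; the precise content of Lemmas 2.1, 2.2, and 2.4 is as in the main text.

% Step 1: Lemma 2.4 places \check\beta_n within distance r_n of \beta^*; if
% r_n \le R_n, this is inside the region where Lemma 2.2 (hence Lemma 2.1)
% applies, so the two estimators coincide.
\[
  \|\check\beta_n - \beta^*\|_2 \le r_n \le R_n
  \quad \text{(and the assumptions of Lemma 2.2)}
  \;\Longrightarrow\;
  \hat\beta_n = \check\beta_n .
\]
% Step 2: the 2-norm bound controls each coordinate, so no component of
% \beta^* on its support changes sign in \check\beta_n.
\[
  \|\check\beta_n - \beta^*\|_\infty
  \le \|\check\beta_n - \beta^*\|_2
  \le r_n \le \beta_{\min}
  \;\Longrightarrow\;
  \operatorname{sign}\check\beta_n = \operatorname{sign}\beta^* .
\]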

Similar Articles

Sparsistency of ℓ1-Regularized M-Estimators

We consider the model selection consistency, or sparsistency, of a broad set of ℓ1-regularized M-estimators for linear and nonlinear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parameter...

On the Use of Variational Inference for Learning Discrete Graphical Models

We study the general class of estimators for graphical model structure based on optimizing ℓ1-regularized approximate log-likelihood, where the approximate likelihood uses tractable variational approximations of the partition function. We provide a message-passing algorithm that directly computes the ℓ1-regularized approximate MLE. Further, in the case of certain reweighted entropy approximation...

Sparsistency and Rates of Convergence in Large Covariance Matrices Estimation (Nov 2007)

This paper studies the sparsistency, rates of convergence, and asymptotic normality for estimating sparse covariance matrices based on penalized likelihood with non-concave penalty functions. Here, sparsistency refers to the property that all parameters that are zero are actually estimated as zero with probability tending to one. Depending on the application, sparsity may occur a priori ...

On Presenting a New Estimator for Estimating the Population Mean in the Presence of Measurement Error and Non-Response

Introduction: According to classical sampling theory, the errors mainly considered in estimation are sampling errors. However, most non-sampling errors affect the properties of estimators more strongly than sampling errors do. This has been confirmed by researchers over the past two decades, especially in relation to non-response errors, which are among the most fundamental non-sampling...

Publication date: 2015